
This document briefly explains the steps to follow in order to use the OTRubric framework to display a student report (including grading) for an activity, given basic rules that calculate grades from different indicators.

Assessment and reporting process overview

In this framework, in order to assess student work and produce a quantitative report, the following steps take place:

  • Step 1. The activity (script) logs the goal of the activity and everything the student does (all-events log).
  • Step 2. From the general all-events log, a script extracts the relevant information and performs the calculations needed to produce learning indicators that can be quantified (graded) individually. Note: These indicators are generally designed around the assessment needs and expressed in a rubric, which is essentially a collection of indicators (or "relevant things to pay attention to") together with a grading system that assigns points per indicator according to the different relevant possibilities the student's work can produce.
  • Step 3. From the individual indicator values, a script applies the rubric to the student work, producing a report that contains the student's total grade as well as details on the grading of each indicator.

The following documentation covers Steps 2 and 3. Step 1, and the all-events log in general, is skipped for now, since it is easier for the activity script to go to Step 2 right away and generate the indicator values directly instead of going through an all-events log first (which would need to be parsed by another script anyway).

In order to create a report out of your activity, you need to implement Step 2, and you also need to design a rubric that matches your indicators. Step 3 is done automatically for you by Java classes already written in the framework. The framework also contains a basic UI to edit the rubric.

How to create a report of an activity

Design a rubric

Think of the useful learning indicators that you want to consider when grading the student work. Make a list of the indicators and all the possible values that you want to consider for each one.
Then, you will have to express the rubric in an otml file using the classes in the org.concord.otrunkcapa.rubric package.

For example, let's say we have an activity that asks a student to perform a simple sum of two numbers.
In order to grade the activity, you would probably first like to know whether the student answered correctly. Maybe you also want to know how many different answers the student wrote down before deciding to submit an answer definitively. Another useful indicator might be how long the student took to answer the question.
So, your rubric otml file could look like this:

<OTRubric local_id="rubric_object" name="Sum of two numbers rubric">
  <indicators>
    <OTRubricIndicator name="answerCorrect" label="Answer" description="Whether the answer submitted was correct">
      <possibleValues>
        <OTRubricIndicatorValue value="0" category="Bad" label="Bad" description="Incorrect value" points="0"/>
        <OTRubricIndicatorValue value="1" category="Good" label="Good" description="Correct value" points="10"/>
      </possibleValues>
    </OTRubricIndicator>
    <OTRubricIndicator name="numberAnswers" label="Number of answers" description="How many times the student answered">
      <possibleRanges>
      <!--    range means min < x <= max    -->
        <OTRubricIndicatorRange minValue="0" maxValue="1" category="Good" label="" description="Only one value written down" points="5"/>
        <OTRubricIndicatorRange minValue="1"              category="Bad" label="" description="More than one value written down" points="2"/>
      </possibleRanges>
    </OTRubricIndicator>
    <OTRubricIndicator name="time" label="Time (s)" description="Time the student took to submit the answer (in seconds)" showValue="true">
      <possibleRanges>
      <!--    range means min < x <= max    -->
        <OTRubricIndicatorRange                maxValue="60"  category="Good" label="" description="Answer submitted under a minute" points="5"/>
        <OTRubricIndicatorRange minValue="60"  maxValue="300" category="Ok" label="" description="Answer submitted between 1 and 5 minutes" points="3"/>
        <OTRubricIndicatorRange minValue="300"                category="Bad" label="" description="Student took more than 5 minutes" points="0"/>
      </possibleRanges>
    </OTRubricIndicator>
  </indicators>
</OTRubric>

This sample rubric has the three indicators mentioned above. Once all the indicators have been designed in the rubric, there is a UI that can be used to change the points per indicator. In this particular example, the points total 20, and half of them come from whether the answer is correct or not.
The first indicator (whether the answer is correct) has only two possible values: incorrect or correct. We could also consider a middle-ground possibility, for example when the student is "almost" correct (forgot a zero, got the sign wrong, etc.); see the sketch below.
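For instance, a third value could be inserted between the incorrect and correct ones. The snippet below is only an illustrative sketch that reuses the attributes already shown above; the value numbers, categories and points are arbitrary choices, and the script would then have to produce 0, 1 or 2 for the answerCorrect indicator.

<OTRubricIndicator name="answerCorrect" label="Answer" description="Whether the answer submitted was correct">
  <possibleValues>
    <OTRubricIndicatorValue value="0" category="Bad" label="Bad" description="Incorrect value" points="0"/>
    <!--    Hypothetical middle-ground value: essentially right, but with a missing zero or a wrong sign    -->
    <OTRubricIndicatorValue value="1" category="Ok" label="Almost" description="Almost correct value" points="5"/>
    <OTRubricIndicatorValue value="2" category="Good" label="Good" description="Correct value" points="10"/>
  </possibleValues>
</OTRubricIndicator>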

Write a script that generates a value for each indicator in your rubric

In order to be able to use a rubric, the activity script will have to generate an assessment object (OTAssessment) that contains one value per indicator in the rubric.
Generally, the script will add this object to the contents section of the OTScriptObject.
To follow the previous example, this is script code that could be written to generate indicator values for the sample rubric above:

//Create an assessment object
var otAssessment = otObjectService.createObject(OTAssessment);
//Add the new assessment object to the contents of the script object
otContents.add(otAssessment);

//Add the indicators one by one
if (answerIsCorrect()){         //Assuming there is a function in the script that returns whether the answer was correct or not
  //Correct answer
  otAssessment.getIndicatorValues().put("answerCorrect", new java.lang.Integer(1));
}
else{
  //Incorrect answer
  otAssessment.getIndicatorValues().put("answerCorrect", new java.lang.Integer(0));
}
otAssessment.getIndicatorValues().put("numberAnswers", numberAnswers);    //Assuming we have a variable in the script that records how many answers have been written
otAssessment.getIndicatorValues().put("time", time);    //Assuming we have a variable in the script that records how long the student took

It is your job (the script's job) to perform the right calculations to determine the value of each indicator (whether the answer is correct, how long the student took, etc.). Use the indicator name as the key in the value map of the assessment object.
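The code above assumes that the script already has an answerIsCorrect() function and numberAnswers and time variables. As a rough sketch only (firstNumber, secondNumber, studentAnswer and startTime are hypothetical names used for illustration, not part of the framework), such helper logic could look like this:

//Hypothetical bookkeeping kept by the activity script
var correctAnswer = firstNumber + secondNumber;     //the expected sum (firstNumber and secondNumber are assumed to exist)
var numberAnswers = 0;                              //increment this every time the student writes down a new answer
var startTime = java.lang.System.currentTimeMillis();

//Returns whether the last submitted answer matches the expected sum
function answerIsCorrect(){
  return studentAnswer == correctAnswer;            //studentAnswer is assumed to hold the last value the student submitted
}

//Elapsed time in seconds, matching the "time" indicator of the rubric
var time = (java.lang.System.currentTimeMillis() - startTime) / 1000;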

Display the report

In order to show a report, use the OTAssessmentView class as the OT view for an OTAssessment object. This view will need a view config object (OTAssessmentViewConfig) that provides the OTRubric object to use.
This is an example of the otml needed to use this view:

<OTAssessmentViewConfig
  objectClass="org.concord.otrunkcapa.OTAssessment"
  viewClass="org.concord.otrunkcapa.OTAssessmentView">
  <rubric>
    <object refid="rubric_object"/>
  </rubric>
</OTAssessmentViewConfig>

In this case, the view config is referencing the OTRubric object using its local id, assuming the rubric is defined in the same otml file.
If the rubric is defined in a separate otml file, you can include the external otml file using an OTInclude, and then reference the rubric using its global id (unique uuid) instead of the local id.
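As a rough sketch only (the file name, the href attribute and the uuid shown here are placeholders, so check the OTInclude documentation for the exact syntax in your version of OTrunk), this could look like:

<!--    Include the external otml file that defines the rubric (placeholder file name and attribute)    -->
<OTInclude href="sum_rubric.otml"/>

<!--    Then reference the rubric by its global id (placeholder uuid) instead of the local id    -->
<OTAssessmentViewConfig
  objectClass="org.concord.otrunkcapa.OTAssessment"
  viewClass="org.concord.otrunkcapa.OTAssessmentView">
  <rubric>
    <object refid="a1b2c3d4-0000-0000-0000-000000000000"/>
  </rubric>
</OTAssessmentViewConfig>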

To display this report, you can use the report button, which automatically pops up a view of the contents section of the script object; since the new OTAssessment object was added to the contents, it will be included in that report.
